Supplementary Material: Sequential Inference for Deep Gaussian Process

Authors

  • Yali Wang
  • Marcus Brubaker
  • Brahim Chaib-draa
  • Raquel Urtasun
Abstract

In this section we briefly review sparse online GPs (GPso) [1, 2]. The key idea is to learn GPs recursively by updating the posterior mean and covariance over the training set {(x_t, y_t)}_{t=1}^n in a sequential fashion. This online procedure is coupled with a sparsification mechanism in which a fixed-size subset of the training set (called the active set) is iteratively selected, avoiding the unbounded growth in computation that the updates would otherwise incur.
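To make the recursion concrete, below is a minimal sketch of online GP regression with a fixed-size active set. The class name, the fixed hyperparameters, and the discard-oldest rule are illustrative assumptions only; the methods cited in [1, 2] use a more principled, information-based criterion to decide which point to remove from the active set.

```python
import numpy as np

def rbf(a, b, ell=1.0):
    # Squared-exponential kernel between two sets of 1-D inputs.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

class SparseOnlineGP:
    """Sketch of an online GP regressor with a fixed-size active set.

    Each new point is absorbed into the active set; once the set is
    full, the oldest point is dropped (a deliberately simple stand-in
    for the selection criteria of the cited sparse online GPs).
    """
    def __init__(self, capacity=20, noise=0.1, ell=1.0):
        self.capacity, self.noise, self.ell = capacity, noise, ell
        self.X = np.empty(0)   # active-set inputs
        self.y = np.empty(0)   # active-set targets

    def update(self, x, y):
        # Sequential update: absorb one observation at a time.
        self.X = np.append(self.X, x)
        self.y = np.append(self.y, y)
        if len(self.X) > self.capacity:
            # Sparsification step: keep the active set at fixed size.
            self.X, self.y = self.X[1:], self.y[1:]

    def predict(self, x_star):
        # Standard GP posterior mean/variance computed on the active
        # set only, so cost is bounded by the capacity, not by n.
        K = rbf(self.X, self.X, self.ell) \
            + self.noise ** 2 * np.eye(len(self.X))
        k = rbf(np.atleast_1d(x_star), self.X, self.ell)
        alpha = np.linalg.solve(K, self.y)
        mean = k @ alpha
        var = 1.0 - np.einsum('ij,ji->i', k, np.linalg.solve(K, k.T))
        return mean, var

# Stream 60 noiseless samples of sin(x); only the last 15 are retained.
gp = SparseOnlineGP(capacity=15)
for x in np.linspace(0, 2 * np.pi, 60):
    gp.update(x, np.sin(x))
m, v = gp.predict(5.5)  # query inside the active set's input range
```

The prediction at x = 5.5 falls inside the region covered by the retained points, so the posterior mean closely tracks sin(5.5) even though most of the stream has been discarded.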


Similar Articles

Sequential Inference for Deep Gaussian Process

A deep Gaussian process (DGP) is a deep network in which each layer is modelled with a Gaussian process (GP). It is a flexible model that can capture highly-nonlinear functions for complex data sets. However, the network structure of DGP often makes inference computationally expensive. In this paper, we propose an efficient sequential inference framework for DGP, where the data is processed seq...


Gaussian Process Regression Networks Supplementary Material

In this supplementary material, we discuss some further details of our ESS and VB inference (Sections 1 and 2), the computational complexity of our inference procedures (Section 3), and the correlation structure induced by the GPRN model (Section 4). We also discuss multimodality in the GPRN posterior (Section 5), SVLMC, and some background information and notation for Gaussian process regressi...


Supplementary Material: Memoized Online Variational Inference for Dirichlet Process Mixture Models

This document contains supplementary mathematics and algorithm descriptions to help readers understand our new learning algorithm. First, in Sec. 1 we offer detailed model description and update equations for a DP-GMM with zero-mean, full-covariance Gaussian likelihood. Second, in Sec. 2 we provide step-by-step discussion of our birth move algorithm, providing a level-of-detail at which the int...


Recurrent Gaussian Processes

We define Recurrent Gaussian Processes (RGP) models, a general family of Bayesian nonparametric models with recurrent GP priors which are able to learn dynamical patterns from sequential data. Similar to Recurrent Neural Networks (RNNs), RGPs can have different formulations for their internal states, distinct inference methods and be extended with deep structures. In such context, we propose a ...




Publication date: 2016